Parquet Modular Encryption support #6637

Draft
rok wants to merge 8 commits into main from decryption-basics-fork

Conversation

@rok
Member

@rok rok commented Oct 28, 2024

Which issue does this PR close?

This PR is based on an existing branch and an internal patch, and aims to provide basic modular encryption support. Closes #3511.

Rationale for this change

See #3511.

What changes are included in this PR?

TBD

Are there any user-facing changes?

TBD

@github-actions github-actions bot added the parquet (Changes to the parquet crate) label Oct 28, 2024
@rok
Member Author

rok commented Oct 28, 2024

Currently this is a rough rebase of work done by @ggershinsky. As ParquetMetaDataReader is now available, some refactoring will be required.

@etseidl
Contributor

etseidl commented Oct 28, 2024

> As ParquetMetaDataReader is now available, some refactoring will be required.

@rok let me know if you want any help shoehorning this into ParquetMetaDataReader.

@brainslush

Is there any help, input or contribution needed here?

@rok
Member Author

rok commented Nov 21, 2024

Thanks for the offer @etseidl & @brainslush! I'm making some progress and would definitely appreciate a review! I'll ping once I push.

@rok rok force-pushed the decryption-basics-fork branch from 7faac72 to 6f055f9 on November 23, 2024 21:56
@rok rok force-pushed the decryption-basics-fork branch from fe488b3 to d263510 on November 23, 2024 23:06
@rok
Member Author

rok commented Dec 4, 2024

> As ParquetMetaDataReader is now available, some refactoring will be required.
>
> @rok let me know if you want any help shoehorning this into ParquetMetaDataReader.

@etseidl could you please do a quick pass to say whether this makes sense with respect to ParquetMetaDataReader?
I'll continue with data decryption.

Contributor

@etseidl etseidl left a comment


Only looking at the metadata bits for now...looks good to me so far. Just a few minor nits. Thanks @rok!

@@ -52,13 +53,16 @@ pub fn parse_metadata<R: ChunkReader>(chunk_reader: &R) -> Result<ParquetMetaDat
 /// Decodes [`ParquetMetaData`] from the provided bytes.
 ///
 /// Typically this is used to decode the metadata from the end of a parquet
-/// file. The format of `buf` is the Thift compact binary protocol, as specified
+/// file. The format of `buf` is the Thrift compact binary protocol, as specified
Contributor


❤️

 /// by the [Parquet Spec].
 ///
 /// [Parquet Spec]: https://github.com/apache/parquet-format#metadata
 #[deprecated(since = "53.1.0", note = "Use ParquetMetaDataReader::decode_metadata")]
-pub fn decode_metadata(buf: &[u8]) -> Result<ParquetMetaData> {
-    ParquetMetaDataReader::decode_metadata(buf)
+pub fn decode_metadata(
Contributor


I'm not sure we should be updating a deprecated function. If encryption is desired I'd say force use of the new API so we don't have to maintain this one. Just pass None to ParquetMetaDataReader::decode_metadata.
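For what it's worth, a minimal standalone sketch of that direction, with stand-in types; the two-argument decode_metadata signature and the FileDecryptionProperties name are assumptions modelled on this PR, not the released parquet API:

#[derive(Clone)]
struct FileDecryptionProperties; // stand-in for the type added by this PR

struct ParquetMetaData; // stand-in

struct ParquetMetaDataReader;

impl ParquetMetaDataReader {
    // Hypothetical new entry point that optionally takes decryption properties.
    fn decode_metadata(
        _buf: &[u8],
        _decryption_properties: Option<&FileDecryptionProperties>,
    ) -> ParquetMetaData {
        ParquetMetaData // Thrift decoding elided
    }
}

// The deprecated free function stays a thin wrapper that simply passes None,
// so it never has to grow encryption parameters.
#[deprecated(note = "Use ParquetMetaDataReader::decode_metadata")]
fn decode_metadata(buf: &[u8]) -> ParquetMetaData {
    ParquetMetaDataReader::decode_metadata(buf, None)
}

fn main() {
    #[allow(deprecated)]
    let _meta = decode_metadata(&[]);
}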

     &mut fetch,
     file_size,
     self.get_prefetch_size(),
+    self.file_decryption_properties.clone(),
Contributor


Very minor nit: I understand that file_decryption_properties needs to be cloned eventually...just wondering if we could pass references down into decode_metadata and do the clone there where it's more obviously needed.
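A standalone sketch of what that could look like, with illustrative names rather than this PR's exact code: call sites lend the properties as Option<&_>, and the single clone happens inside decode_metadata where the owned copy is actually stored:

#[derive(Clone)]
struct FileDecryptionProperties {
    footer_key: Vec<u8>, // hypothetical field
}

struct DecodedMetadata {
    // The owned copy lives with the decoded metadata, so the clone belongs here.
    decryption: Option<FileDecryptionProperties>,
}

fn decode_metadata(buf: &[u8], props: Option<&FileDecryptionProperties>) -> DecodedMetadata {
    let _ = buf; // Thrift decoding elided
    DecodedMetadata {
        decryption: props.cloned(), // single clone, at the point of ownership
    }
}

fn main() {
    let props = FileDecryptionProperties { footer_key: vec![0u8; 16] };
    // Callers only pass a reference down; no clone at each level of the chain.
    let meta = decode_metadata(&[], Some(&props));
    println!("metadata holds decryption properties: {}", meta.decryption.is_some());
}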

@rok rok force-pushed the decryption-basics-fork branch from f90d8b4 to 29d55eb on December 16, 2024 23:51
)?;

// todo: This currently fails, possibly due to wrongly generated AAD
let buf = file_decryptor.decrypt(prot.read_bytes()?.as_slice(), aad.as_ref());
Member Author


@ggershinsky I'm stuck on decrypting a page header here. Anything obvious you notice that I'm doing wrong?
(My test case is a uniformly encrypted Parquet file from the parquet test data, encrypted with AES_GCM_V1 I believe. Spec for reference.)
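For reference while debugging, here is a minimal sketch of the module AAD layout the spec describes and which create_module_aad is meant to produce; the module-type value and the optional page ordinal below come from the spec, not from this branch:

fn module_aad(
    file_aad: &[u8],           // aad_prefix (if any) followed by aad_file_unique
    module_type: u8,           // 1-byte module type from the spec's enum
    row_group_ordinal: i16,
    column_ordinal: i16,
    page_ordinal: Option<i16>, // present for data pages and data page headers
) -> Vec<u8> {
    let mut aad = Vec::with_capacity(file_aad.len() + 7);
    aad.extend_from_slice(file_aad);
    aad.push(module_type);
    // All ordinals are written as 2-byte little-endian values.
    aad.extend_from_slice(&row_group_ordinal.to_le_bytes());
    aad.extend_from_slice(&column_ordinal.to_le_bytes());
    if let Some(page) = page_ordinal {
        aad.extend_from_slice(&page.to_le_bytes());
    }
    aad
}

fn main() {
    // Assumed spec value: 4 = DataPageHeader; row group 0, column 0, first page.
    let aad = module_aad(b"example-file-aad", 4, 0, 0, Some(0));
    println!("module AAD is {} bytes", aad.len());
}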

@adamreeve

Hi Rok, I had a look at this and noticed a couple of problems. One minor issue is that the module type is hard-coded to be a dictionary page header, but it looks like the first page encountered should be a data page header, based on debugging a read of the same file from C++. And then in create_module_aad, the page index is written as an i32 but it should be an i16. Kind of related, there are a bunch of checks there like row_group_ordinal > i16::MAX, but row_group_ordinal is already an i16 so this is a no-op (not sure if there is proper error handling when converting these to i16 further up?).

The main problem seems to be that prot.read_bytes here reads the length of the buffer as a Thrift varint and then passes the remaining buffer through to be decrypted, but for encrypted buffers the length is actually written as a 4-byte little-endian integer, and the decrypt method also expects to receive a ciphertext buffer that includes the 4 length bytes.

With the following changes I can get to todo!("Decrypted page header!"):

diff --git a/parquet/src/encryption/ciphers.rs b/parquet/src/encryption/ciphers.rs
index 89515fe0e..fc0b703ba 100644
--- a/parquet/src/encryption/ciphers.rs
+++ b/parquet/src/encryption/ciphers.rs
@@ -194,7 +194,7 @@ fn create_module_aad(file_aad: &[u8], module_type: ModuleType, row_group_ordinal
     }
     if row_group_ordinal > i16::MAX {
         return Err(general_err!("Encrypted parquet files can't have more than {} row groups: {}",
-            u16::MAX, row_group_ordinal));
+            i16::MAX, row_group_ordinal));
     }
 
     if column_ordinal < 0 {
@@ -202,7 +202,7 @@ fn create_module_aad(file_aad: &[u8], module_type: ModuleType, row_group_ordinal
     }
     if column_ordinal > i16::MAX {
         return Err(general_err!("Encrypted parquet files can't have more than {} columns: {}",
-            u16::MAX, column_ordinal));
+            i16::MAX, column_ordinal));
     }
 
     if module_buf[0] != (ModuleType::DataPageHeader as u8) &&
@@ -218,10 +218,11 @@ fn create_module_aad(file_aad: &[u8], module_type: ModuleType, row_group_ordinal
     if page_ordinal < 0 {
         return Err(general_err!("Wrong page ordinal: {}", page_ordinal));
     }
-    if page_ordinal > i32::MAX {
+    if page_ordinal > (i16::MAX as i32){
         return Err(general_err!("Encrypted parquet files can't have more than {} pages in a chunk: {}",
-            u16::MAX, page_ordinal));
+            i16::MAX, page_ordinal));
     }
+    let page_ordinal = page_ordinal as i16;
 
     let mut aad = Vec::with_capacity(file_aad.len() + 7);
     aad.extend_from_slice(file_aad);
diff --git a/parquet/src/file/serialized_reader.rs b/parquet/src/file/serialized_reader.rs
index adf4aa07a..48796b09f 100644
--- a/parquet/src/file/serialized_reader.rs
+++ b/parquet/src/file/serialized_reader.rs
@@ -342,7 +342,6 @@ impl<R: 'static + ChunkReader> RowGroupReader for SerializedRowGroupReader<'_, R
 
 /// Reads a [`PageHeader`] from the provided [`Read`]
 pub(crate) fn read_page_header<T: Read>(input: &mut T, crypto_context: Option<Arc<CryptoContext>>) -> Result<PageHeader> {
-    let mut prot = TCompactInputProtocol::new(input);
     if let Some(crypto_context) = crypto_context {
         // let mut buf = [0; 16 * 1024];
         // let size = input.read(&mut buf)?;
@@ -354,19 +353,25 @@ pub(crate) fn read_page_header<T: Read>(input: &mut T, crypto_context: Option<Ar
 
         let aad = create_page_aad(
             aad_file_unique.as_slice(),
-            ModuleType::DictionaryPageHeader,
+            ModuleType::DataPageHeader,
             crypto_context.row_group_ordinal,
             crypto_context.column_ordinal,
             0,
         )?;
 
-        // todo: This currently fails, possibly due to wrongly generated AAD
-        let buf = file_decryptor.decrypt(prot.read_bytes()?.as_slice(), aad.as_ref());
+        let mut len_bytes = [0; 4];
+        input.read_exact(&mut len_bytes)?;
+        let ciphertext_len = u32::from_le_bytes(len_bytes) as usize;
+        let mut ciphertext = vec![0; 4 + ciphertext_len];
+        input.read_exact(&mut ciphertext[4..])?;
+        let buf = file_decryptor.decrypt(&ciphertext, aad.as_ref());
         todo!("Decrypted page header!");
         let mut prot = TCompactSliceInputProtocol::new(buf.as_slice());
         let page_header = PageHeader::read_from_in_protocol(&mut prot)?;
         return Ok(page_header)
     }
+
+    let mut prot = TCompactInputProtocol::new(input);
     let page_header = PageHeader::read_from_in_protocol(&mut prot)?;
     Ok(page_header)
 }

Member Author


Thanks @adamreeve! That was indeed the issue! :)


cool, thanks!
